Error Analysis of the Fixed Point RLS Algorithm

Authors

  • Tulay Adali
  • Sasan H. Ardalan
Abstract

In this report, the steady state mean square prediction error is derived for the fixed point RLS (Recursive Least Squares) algorithm, both for the exponentially windowed RLS (forgetting factor λ < 1) and the prewindowed growing memory RLS (λ = 1), for correlated inputs. It is shown that signal correlation enhances the excess error due to additive noise and roundoff noise in the desired signal prediction computation. However, correlation has no effect on the noise due to roundoff of the weight error update recursion, which is the error term leading to the divergence of the algorithm for λ = 1. It is also shown that the convergence rate of the algorithm depends on the filter order, the choice of the forgetting factor and, most importantly, on the eigenvalue spread of the data. Convergence is slower if the data is highly correlated, i.e. has a large eigenvalue spread.
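As a concrete illustration of the recursion analyzed above, the following is a minimal floating-point sketch of the exponentially windowed RLS algorithm driven by a correlated AR(1) input; the filter order, forgetting factor, noise level and AR coefficient are illustrative assumptions, not values taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the report): identify an unknown
# FIR system from a correlated AR(1) input observed in additive noise.
M = 4                                   # filter order (assumed)
lam = 0.99                              # forgetting factor lambda < 1 (assumed)
rho = 0.95                              # AR(1) coefficient; rho near 1 gives a
                                        # large eigenvalue spread (assumed)
N = 2000
w_true = rng.standard_normal(M)

u = np.zeros(N)                         # correlated input signal
for n in range(1, N):
    u[n] = rho * u[n - 1] + rng.standard_normal()

w = np.zeros(M)                         # weight estimate
P = 1e3 * np.eye(M)                     # inverse input correlation estimate
for n in range(M, N):
    x = u[n - M:n][::-1]                # regressor, most recent sample first
    d = w_true @ x + 0.01 * rng.standard_normal()   # desired signal + noise
    k = P @ x / (lam + x @ P @ x)       # gain vector
    e = d - w @ x                       # a priori prediction error
    w = w + k * e                       # weight update
    P = (P - np.outer(k, x @ P)) / lam  # inverse correlation update

print("weight error norm:", np.linalg.norm(w - w_true))
```

Moving rho closer to 1 increases the eigenvalue spread of the input correlation matrix and, consistent with the abstract, slows the convergence of the weight estimate.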


Similar resources

Sensitivity Analysis of Transversal RLS Algorithms With Correlated Inputs

In this contribution it is shown that the sensitivity analysis predicts that the mean square excess error is the same for both correlated and white inputs. Correlation increases the variance about the mean square excess error. We also present a stable finite precision RLS algorithm. Many adaptive filtering problems can be cast as a system identification problem. Consider the desired signal d(n...


Infinite Precision Analysis of the Fast QR Algorithms Based on Backward Prediction Errors

The conventional QR Decomposition Recursive Least Squares (QRD-RLS) method requires on the order of N² multiplications, O[N²], per output sample. Nevertheless, a number of Fast QRD-RLS algorithms have been proposed with O[N] complexity. In particular, the Fast QRD-RLS algorithms based on backward prediction errors are well known for their good numerical behavior and low complexity. In such...


A Family of Recursive Least-squares Adaptive Algorithms Suitable for Fixed-point Implementation

The main feature of the least-squares adaptive algorithms is their high convergence rate. Unfortunately, they encounter numerical problems in finite precision implementation and especially in fixed-point arithmetic. The objective of this paper is twofold. First, an analysis of the finite precision effects of the recursive least-squares (RLS) algorithm is performed, outlining some specific probl...
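One common way to examine such finite precision effects is to simulate fixed-point arithmetic in floating point by rounding every intermediate result to a fixed grid; the 12-bit fractional word length and the sample numbers below are assumptions for illustration, not the paper's model.

```python
import numpy as np

def quantize(x, frac_bits=12):
    # Round to the nearest multiple of 2**-frac_bits, mimicking fixed-point
    # rounding; word length limits and overflow handling are omitted.
    step = 2.0 ** (-frac_bits)
    return np.round(np.asarray(x, dtype=float) / step) * step

# Roundoff noise injected into a single RLS-style weight update
# (all numerical values are arbitrary illustrative choices).
w = np.array([0.25, -0.5, 0.125])
k = np.array([0.01, 0.02, -0.015])
e = 3.7e-3
w_float = w + k * e                        # infinite-precision update
w_fixed = quantize(w + quantize(k * e))    # fixed-point style update
print("roundoff error per weight:", w_fixed - w_float)
```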


A New Stabilization Technique for the Fixed-Point Prewindowed RLS Algorithm

In this correspondence, a stable finite precision Recursive Least Squares (RLS) algorithm is derived for the prewindowed growing memory case (forgetting factor λ = 1). Previously, it has been shown that the prewindowed growing memory RLS algorithm diverges under fixed-point implementation [2,1]. The random walk phenomenon due to roundoff errors in the weight update causes the divergence of th...
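The random walk mechanism mentioned above can be made concrete with a toy simulation; the zero-mean uniform roundoff model and the quantization step are assumptions for illustration, and the sketch shows only the divergence mechanism, not the stabilization technique proposed in the correspondence.

```python
import numpy as np

rng = np.random.default_rng(1)

lsb = 2.0 ** (-12)        # assumed quantization step (one LSB)
N = 100_000

# Model each weight-update roundoff as a zero-mean uniform error of at most
# half an LSB. With lambda = 1 there is no decay of past errors, so they
# accumulate as a random walk whose variance grows linearly with time.
increments = rng.uniform(-lsb / 2, lsb / 2, size=N)
walk = np.cumsum(increments)

for n in (10**3, 10**4, 10**5):
    predicted_std = lsb * np.sqrt(n / 12.0)   # std of a sum of n U(-lsb/2, lsb/2)
    print(f"n={n:>6}  accumulated error={walk[n - 1]: .3e}  "
          f"predicted std={predicted_std:.3e}")
```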


Square-Root Algorithms of Recursive Least-Squares Wiener Estimators in Linear Discrete-Time Stochastic Systems

This paper addresses the QR decomposition and UD factorization based square-root algorithms of the recursive least-squares (RLS) Wiener fixed-point smoother and filter. In the RLS Wiener estimators, the Riccati-type difference equations for the auto-variance function of the filtering estimate are included. Hence, due to roundoff errors, in the case of a small value of the observation noise va...
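For reference, below is a minimal sketch of the UD factorization P = U D U^T (U unit upper triangular, D diagonal) that UD-based square-root algorithms propagate in place of the covariance itself; the Bierman-style recursion and the test matrix are assumptions for illustration, and the actual estimator update recursions are omitted.

```python
import numpy as np

def ud_factorize(P):
    # Factor a symmetric positive-definite matrix as P = U @ diag(d) @ U.T
    # with U unit upper triangular, using a Bierman-style backward recursion.
    P = P.astype(float).copy()
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
            for k in range(i + 1):
                P[i, k] -= U[i, j] * d[j] * U[k, j]
    return U, d

# Quick check on an arbitrary symmetric positive-definite matrix (assumed).
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
P = A @ A.T + 4.0 * np.eye(4)
U, d = ud_factorize(P)
print("reconstruction error:", np.max(np.abs(U @ np.diag(d) @ U.T - P)))
```

Propagating U and d rather than P keeps the implied covariance symmetric and nonnegative definite under roundoff, which is the usual motivation for square-root formulations.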



Publication date: 2007